Optimal storage capacity of quantum Hopfield neural networks

Authors

Abstract

Quantum neural networks form one pillar of the emergent field of quantum machine learning. Here, generalizations of classical networks realizing associative memories---capable of retrieving patterns, or memories, from corrupted initial states---have been proposed. It is a challenging open problem to analyze quantum associative memories with an extensive number of patterns and to determine the maximal number they can reliably store, i.e., their storage capacity. In this work, we propose and explore a general method for evaluating the maximal storage capacity of quantum neural network models. By generalizing what is known as Gardner's approach in the classical realm, we exploit the theory of spin glasses for deriving the optimal capacity for networks with quenched pattern variables. As an example, we apply our method to an open-system quantum associative memory formed of interacting spin-1/2 particles realizing coupled artificial neurons. The system undergoes a Markovian time evolution resulting from a dissipative retrieval dynamics that competes with a coherent quantum dynamics. We map out the nonequilibrium phase diagram and study the effect of temperature and Hamiltonian dynamics on the storage capacity. Our approach opens an avenue for a systematic characterization of quantum associative memories.
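The classical baseline that the quantum generalizations above build on is the Hopfield model with Hebbian couplings. The following is a minimal classical sketch of pattern storage and zero-temperature retrieval (an illustration only; it is not the paper's quantum method, and the sizes chosen are arbitrary, with the load P/N kept well below the known Hebbian capacity of roughly 0.138 N):

```python
import numpy as np

rng = np.random.default_rng(0)
N, P = 200, 10  # neurons, stored patterns (load alpha = P/N = 0.05)

# random +/-1 patterns and Hebbian coupling matrix with zero diagonal
patterns = rng.choice([-1, 1], size=(P, N))
W = (patterns.T @ patterns) / N
np.fill_diagonal(W, 0.0)

def retrieve(state, steps=20):
    """Synchronous zero-temperature dynamics: s <- sign(W s)."""
    for _ in range(steps):
        new = np.where(W @ state >= 0, 1, -1)
        if np.array_equal(new, state):
            break  # reached a fixed point (an attractor)
        state = new
    return state

# corrupt 10% of the first pattern, then let the dynamics retrieve it
probe = patterns[0].copy()
flip = rng.choice(N, size=N // 10, replace=False)
probe[flip] *= -1
recalled = retrieve(probe)
overlap = recalled @ patterns[0] / N  # overlap near 1 means successful recall
print(f"overlap with stored pattern: {overlap:.2f}")
```

At this low load the corrupted state falls inside the pattern's basin of attraction and the overlap returns close to 1; Gardner's approach asks how large P/N can become before such retrieval fails for the best possible couplings, rather than the Hebbian ones used here.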


Related articles

Optimal storage capacity of neural networks at finite temperatures

Gardner's analysis of the optimal storage capacity of neural networks is extended to study finite-temperature effects. The typical volume of the space of interactions is calculated for strongly diluted networks as a function of the storage ratio α, temperature T, and the tolerance parameter m, from which the optimal storage capacity αc is obtained as a function of T and m. At zero temperature...


Storage Capacity of Letter Recognition in Hopfield Networks

Associative memory is a dynamical system which has a number of stable states, each with a domain of attraction around it [1]. If the system starts at any state in the domain, it will converge to the locally stable state, which is called an attractor. In 1982, Hopfield [2] proposed a fully connected neural network model of associative memory in which patterns can be stored, distributed among neuro...
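The convergence to attractors described above holds because, for symmetric couplings with zero diagonal, asynchronous updates never increase the network energy E = -½ sᵀJs. A minimal sketch demonstrating this monotonicity (random symmetric couplings chosen for illustration, not taken from any of the cited papers):

```python
import numpy as np

rng = np.random.default_rng(1)
N = 100
# random symmetric couplings with zero diagonal, as the energy argument requires
J = rng.normal(size=(N, N))
J = (J + J.T) / 2
np.fill_diagonal(J, 0.0)

def energy(s):
    return -0.5 * s @ J @ s

s = rng.choice([-1, 1], size=N)
energies = [energy(s)]
for _ in range(5 * N):
    i = rng.integers(N)                    # asynchronous single-spin update
    s[i] = 1 if J[i] @ s >= 0 else -1      # align spin with its local field
    energies.append(energy(s))

# the energy is nonincreasing, so the dynamics must settle into an attractor
assert all(e2 <= e1 + 1e-9 for e1, e2 in zip(energies, energies[1:]))
print("energy went from", round(energies[0], 2), "to", round(energies[-1], 2))
```

Each flip changes the energy by -(s_i' - s_i)·h_i with h_i = Σ_j J_ij s_j, which is never positive when s_i' = sign(h_i); since the energy is bounded below, the trajectory terminates in a locally stable state.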


Enhanced storage capacity with errors in scale-free Hopfield neural networks: An analytical study

The Hopfield model is a pioneering neural network model with associative memory retrieval. The analytical solution of the model in the mean-field limit revealed that memories can be retrieved without any error up to a finite storage capacity of O(N), where N is the system size. Beyond the threshold, they are completely lost. Since the introduction of the Hopfield model, the theory of neural network...


Storage capacity of two-dimensional neural networks.

We investigate the maximum number of embedded patterns in the two-dimensional Hopfield model. The ground-state energies of two specific network states, namely, the energies of the pure ferromagnetic state and of the state with one specific stored pattern, are calculated exactly in terms of the correlation function of the ferromagnetic Ising model. We also investigate the energy landscape around them a...


Stochastic Hopfield neural networks

Hopfield (1984 Proc. Natl Acad. Sci. USA 81 3088–92) showed that the time evolution of a symmetric neural network is a motion in state space that seeks out minima in the system energy (i.e. the limit set of the system). In practice, a neural network is often subject to environmental noise. It is therefore useful and interesting to find out whether the system still approaches some limit set unde...



Journal

Journal title: Physical Review Research

Year: 2023

ISSN: 2643-1564

DOI: https://doi.org/10.1103/physrevresearch.5.023074